    The legalities and politics of health informatics

    Privacy as personal resistance: exploring legal narratology and the need for a legal architecture for personal privacy rights

Different cultures produce different privacies, both architecturally and legally speaking, as reflected in their different legal architectures. The 'Simms principle' can be harnessed to produce semi-constitutional privacy protection through statute, building on the work already done in 'bringing rights home' through the Human Rights Act 1998. This article attempts to set out a notion of semi-entrenched legal rights, which will help to better portray the case for architectural, constitutional privacy, following an examination of the problems with a legal narrative for privacy rights as they currently exist. I will use parallel ideas from the works of W.B. Yeats and Costas Douzinas to explore and critique these assumptions and arguments. The ultimate object of this piece is an argument for the creation of a legal instrument, namely an Act of Parliament, in the United Kingdom, the purpose of which is to protect certain notions of personal privacy from politically-motivated erosion and intrusion.

    'Algorithmic impropriety' in UK policing?

There are concerns that UK policing could soon be awash with 'algorithmic impropriety'. Big(ger) data and machine learning-based algorithms combine to produce opportunities for better intelligence-led management of offenders, but also create regulatory risks and some threats to civil liberties, even though these can be mitigated. In constitutional and administrative law terms, the use of predictive intelligence analysis software to serve up 'algorithmic justice' presents varying human rights and data protection problems based on the manner in which the output of the tool concerned is deployed. But regardless of exact context, in all uses of algorithmic justice in policing there are linked fears: of risks around potential fettering of discretion, arguable biases, possible breaches of natural justice, and troubling failures to take relevant information into account. The potential for 'data discrimination' in the growth of algorithmic justice is a real and pressing problem. This paper seeks to set out a number of arguments, using grounds of judicial review as a structuring tool, that could be deployed against algorithmically-based decision-making processes that one might conceivably object to when encountered in the UK criminal justice system. Such arguments could be used to enhance and augment data protection and/or human rights grounds of review in this emerging algorithmic era, for example, if a campaign group or an individual claimant were to seek to obtain a remedy from the courts in relation to a certain algorithmically-based decision-making process or outcome.

    Countering extremism and recording dissent: intelligence analysis and the Prevent agenda in UK Higher Education

Despite growing calls in the media and from various political quarters for a review of the effectiveness of the Prevent duty, the Home Secretary has recently announced that the duty will essentially continue largely unchanged in its operation, including in UK higher education (HE). This is perhaps because the High Court in the recent case of Butt found that the Prevent duty guidance to universities is lawful and involves no breach of freedom of expression or of privacy rights. But since the Prevent duty involves sharing, where necessary and proportionate, highly sensitive personal information about individuals at risk of being drawn into extremism and terrorism, this 'forward thinking' piece addresses some of the complexities of information law dealt with in the case. Some thoughts are offered about the missed opportunity that a rejection of a review of Prevent would entail.

    Better information sharing, or 'share or be damned'?

Safeguarding, and the information sharing between professionals and bodies which underpins it, is crucial for the prevention of harm to the vulnerable. But it is sometimes worth exploring the 'hard cases', where, on rarer occasions, safeguarding practices might ultimately prove troublesome themselves. As Sue Peckover (2013) has highlighted, a key idea is that we sometimes respond to risks of harm in an overly bureaucratic or otherwise superficial way because we cannot find more resources to intervene most effectively and change the risky behaviours presented by (actual, potential or alleged) offenders or abusers. This is a theoretical and policy analysis-based piece that aims to prompt some questions for readers as to the flourishing culture of information sharing, and the growing body of public policy in relation to public protection disclosures. The piece also offers up some conclusions on new 'naming and shaming' strategies, as part of the "public protection routine" (Grace, 2013b), which is, in essence, the multi-agency work of adult and child safeguarding. This 'direction of travel' toward increasing the ways in which knowledge about risk (and 'risky people') is spread around communities is the creation of what I would call a culture of 'share or be damned' for professionals to navigate. In this way, multi-agency information sharing and disclosures of information to the public (for safeguarding purposes) are an element of what Mike Nash has articulated as the 'politics of public protection' (Nash, 2010). This 'politics of public protection' can be summed up as the social, cultural and policy pressures which affect decision-making in the public protection and safeguarding contexts.

    Lessons on legislating for public protection information sharing: A case commentary on Christian Institute v Lord Advocate [2016] UKSC 51

The decision of the UK Supreme Court in Christian Institute temporarily undermines the Scottish initiative referred to as the named persons scheme, a programme of information governance that will likely (in time) facilitate the sharing of information (and impliedly sensitive personal data) about vulnerable children in Scotland between different child welfare and public protection agencies. The Scottish government already have a blueprint, however, of the legal shortcomings of the relevant information sharing provisions of Part 4 of the Children and Young People (Scotland) Act 2014, since the judgment of the Supreme Court in Christian Institute is very clear and precise in delineating exactly which information rights are lacking in specificity in the scheme as originally legislated by the Scottish Parliament at Holyrood. This case commentary piece seeks to place the decision in Christian Institute in a wider context with regard to the way the courts treat challenges to public protection information sharing on the basis of European data protection law or Article 8 of the European Convention on Human Rights.